In computer science, data validation is the process of ensuring that a program operates on clean, correct and useful data. It uses routines, often called "validation rules", "validation constraints" or "check routines", that check for the correctness, meaningfulness, and security of data that are input to the system. The rules may be implemented through the automated facilities of a data dictionary, or by the inclusion of explicit application program validation logic.

== Overview ==
Data validation is intended to provide certain well-defined guarantees for the fitness, accuracy, and consistency of any of various kinds of user input into an application or automated system. Data validation rules can be defined and designed using any of various methodologies, and can be deployed in any of various contexts. For example, data validation rules may be defined, designed and deployed:

Definition and design contexts:
* as part of the requirements-gathering phase of a software engineering project, or while designing a software specification
* as part of an operations modeling phase in business process modeling

Deployment contexts:
* as part of a user interface
* as a set of programs or business-logic routines in a programming language
* as a set of stored procedures in a database management system

For business applications, data validation can be defined through declarative data integrity rules, or through procedure-based business rules.〔(Data Validation, Data Integrity, Designing Distributed Applications with Visual Studio .NET)〕 Data that does not conform to these rules will negatively affect business process execution. Therefore, data validation should start with the definition of the business process and of the set of business rules within this process. Rules can be collected through the requirements capture exercise.〔Arkady Maydanchik (2007), "Data Quality Assessment", Technics Publications, LLC〕
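As a concrete illustration of a procedure-based business rule, the sketch below shows a simple check routine in Python. The field names, limits, and patterns are hypothetical, not taken from any particular system; in practice, such rules would be derived from the business process definition and the requirements capture exercise described above. The same constraints could alternatively be expressed declaratively, for example as integrity constraints or stored procedures in a database management system.

<syntaxhighlight lang="python">
from datetime import date

def validate_order(order: dict) -> list[str]:
    """Check an order record against simple validation rules.

    Returns a list of human-readable error messages; an empty list
    means the record passed all checks.
    """
    errors = []

    # Correctness check: quantity must be a positive integer.
    quantity = order.get("quantity")
    if not isinstance(quantity, int) or quantity <= 0:
        errors.append("quantity must be a positive integer")

    # Meaningfulness check: the delivery date must not be in the past.
    delivery = order.get("delivery_date")
    if not isinstance(delivery, date) or delivery < date.today():
        errors.append("delivery_date must be today or later")

    # Format/security check: the customer id must match an expected pattern
    # (here, a hypothetical 8-character alphanumeric code).
    customer_id = order.get("customer_id", "")
    if not (isinstance(customer_id, str)
            and customer_id.isalnum()
            and len(customer_id) == 8):
        errors.append("customer_id must be an 8-character alphanumeric code")

    return errors


# Example usage: the record below violates every rule.
problems = validate_order({"quantity": 0, "customer_id": "AB12"})
</syntaxhighlight>

Returning a list of errors, rather than stopping at the first failed check, lets the caller report every violation in a record at once, which is often more useful when validating user input or batch data.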